PAC-Bayes Generalization Bounds for Randomized Structured Prediction

Authors

  • Ben London
  • Bert Huang
  • Ben Taskar
  • Lise Getoor
Abstract

We present a new PAC-Bayes generalization bound for structured prediction that is applicable to perturbation-based probabilistic models. Our analysis explores the relationship between perturbation-based modeling and the PAC-Bayes framework, and connects to recently introduced generalization bounds for structured prediction. We obtain the first PAC-Bayes bounds that guarantee better generalization as the size of each structured example grows.
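As background, the PAC-Bayes framework bounds the risk of a Gibbs classifier drawn from a posterior Q in terms of its empirical risk and its KL divergence from a data-independent prior P. The constants below follow one common variant of McAllester's bound, not necessarily the form used in this paper:

```latex
% A common form of the PAC-Bayes bound (after McAllester, 1999).
% R(Q): expected risk of the Gibbs classifier h ~ Q;
% \hat{R}(Q): its empirical risk on m i.i.d. examples;
% P: prior fixed before seeing the data; \delta: confidence parameter.
% With probability at least 1 - \delta over the sample, for all posteriors Q:
\[
  R(Q) \;\le\; \hat{R}(Q)
  \;+\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{m}}{\delta}}{2m}} .
\]
```

The paper's contribution is to adapt this template to structured prediction with perturbation-based models, so that the guarantee improves as the size of each structured example grows.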


Similar Papers

Chromatic PAC-Bayes Bounds for Non-IID Data

PAC-Bayes bounds are among the most accurate generalization bounds for classifiers learned with IID data, and it is particularly so for margin classifiers. However, there are many practical cases where the training data show some dependencies and where the traditional IID assumption does not apply. Stating generalization bounds for such frameworks is therefore of the utmost interest, both from ...


Entropy-SGD optimizes the prior of a PAC-Bayes bound: Data-dependent PAC-Bayes priors via differential privacy

We show that Entropy-SGD (Chaudhari et al., 2017), when viewed as a learning algorithm, optimizes a PAC-Bayes bound on the risk of a Gibbs (posterior) classifier, i.e., a randomized classifier obtained by a risk-sensitive perturbation of the weights of a learned classifier. Entropy-SGD works by optimizing the bound’s prior, violating the hypothesis of the PAC-Bayes theorem that the prior is cho...
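As a minimal sketch of the kind of Gibbs (posterior) classifier described above, assuming a linear model and a Gaussian posterior centered at the learned weights; the noise scale and all names are illustrative, not the paper's API:

```python
import numpy as np

def gibbs_predict(w_learned, x, sigma=0.1, rng=None):
    """Randomized (Gibbs) linear classifier: draw weights from a
    Gaussian posterior centered at the learned weights, then
    classify with the sampled weights. sigma is illustrative."""
    rng = rng or np.random.default_rng()
    w_sample = w_learned + sigma * rng.standard_normal(w_learned.shape)
    return np.sign(x @ w_sample)

# The Gibbs risk is estimated by averaging error over weight draws.
rng = np.random.default_rng(0)
w = rng.standard_normal(5)          # stand-in for learned weights
X = rng.standard_normal((10, 5))    # stand-in data
y = np.sign(X @ w)                  # labels from the mean classifier
errs = [np.mean(gibbs_predict(w, X, rng=rng) != y) for _ in range(100)]
print("estimated Gibbs risk:", np.mean(errs))
```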


Chromatic PAC-Bayes Bounds for Non-IID Data: Applications to Ranking and Stationary β-Mixing Processes

PAC-Bayes bounds are among the most accurate generalization bounds for classifiers learned from independently and identically distributed (IID) data, and it is particularly so for margin classifiers: there have been recent contributions showing how practical these bounds can be either to perform model selection (Ambroladze et al., 2007) or even to directly guide the learning of linear classifie...


Combining PAC-Bayesian and Generic Chaining Bounds

There exist many different generalization error bounds in statistical learning theory. Each of these bounds contains an improvement over the others for certain situations or algorithms. Our goal is, first, to underline the links between these bounds, and second, to combine the different improvements into a single bound. In particular we combine the PAC-Bayes approach introduced by McAllester (1...


A PAC-Bayesian Analysis of Randomized Learning with Application to Stochastic Gradient Descent

We analyze the generalization error of randomized learning algorithms—focusing on stochastic gradient descent (SGD)—using a novel combination of PAC-Bayes and algorithmic stability. Importantly, our risk bounds hold for all posterior distributions on the algorithm’s random hyperparameters, including distributions that depend on the training data. This inspires an adaptive sampling...
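A minimal sketch of randomized learning in this sense, where SGD's random hyperparameter is the per-step example index, drawn from a possibly data-dependent distribution; every name and parameter below is an assumption for illustration, not the paper's method:

```python
import numpy as np

def sgd_random_order(grad_fn, w0, data, probs, steps, lr=0.01, rng=None):
    """SGD whose per-step example index is drawn from a (possibly
    data-dependent) distribution `probs` over the training set --
    the 'random hyperparameters' in this style of analysis."""
    rng = rng or np.random.default_rng()
    w = w0.copy()
    for _ in range(steps):
        i = rng.choice(len(data), p=probs)   # sampled hyperparameter
        w -= lr * grad_fn(w, data[i])
        # an adaptive scheme could update `probs` from observed losses here
    return w

# Toy usage: least-squares on random data, uniform sampling posterior.
rng = np.random.default_rng(0)
data = [(rng.standard_normal(3), rng.standard_normal()) for _ in range(20)]
grad = lambda w, ex: 2 * (w @ ex[0] - ex[1]) * ex[0]
w = sgd_random_order(grad, np.zeros(3), data, np.full(20, 1 / 20), steps=200)
print("learned weights:", w)
```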





Publication date: 2013